
    Auditory attention enhances processing of positive and negative words in inferior and superior prefrontal cortex

    Wegrzyn M, Herbert C, Ethofer T, Flaisch T, Kißler J. Auditory attention enhances processing of positive and negative words in inferior and superior prefrontal cortex. Cortex. 2017;96:31-45

    Results of a pilot study on the involvement of bilateral inferior frontal gyri in emotional prosody perception: an rTMS study

    Background: The right hemisphere may play an important role in paralinguistic features such as the emotional melody in speech, but the extent of this involvement is unclear. Imaging studies have shown involvement of both left and right inferior frontal gyri in emotional prosody perception. The present pilot study examined whether these brain areas are critically involved in the processing of emotional prosody and of semantics in 9 healthy subjects. Repetitive transcranial magnetic stimulation (rTMS) was used with a coil centred over the left or right inferior frontal gyrus, as localized by neuronavigation based on each subject's MRI; a sham condition was included. An online-TMS approach was applied: an emotional language task was completed during stimulation. This computerized task consisted of sentences pronounced by actors. In the semantics condition an emotion (fear, anger or neutral) was expressed in the content, which was pronounced with a neutral intonation; in the prosody condition the emotion was expressed in the intonation, while the content was neutral. Results: Reaction times in the emotional prosody condition were significantly longer after rTMS over both the right and the left inferior frontal gyrus compared with sham stimulation, after controlling for learning effects associated with order of condition. Taking all emotions together, there was no difference in effect on reaction times between right and left stimulation; for fear, however, reaction times were significantly longer after stimulating the left inferior frontal gyrus than the right. Reaction times in the semantics condition did not differ significantly between the three TMS conditions. Conclusions: The data indicate a critical involvement of both the right and the left inferior frontal gyrus in emotional prosody perception. The findings of this pilot study need replication; future studies should include more subjects and examine whether the left and right inferior frontal gyrus play a differential role and complement each other, e.g. in the integrated processing of linguistic and prosodic aspects of speech, respectively.

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior towards facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) spoken in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (an emotion congruency effect), although this effect was often emotion-specific (with the greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
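    The time-window analysis described in this abstract lends itself to a simple illustration. Below is a minimal sketch, not the authors' code, of how fixations might be binned into the three reported windows and total look duration summed by prosody-face congruency; the field names (onset_ms, duration_ms, face_emotion, voice_emotion) are hypothetical.

```python
# Sketch: bin fixations into the study's three time windows and sum
# look duration per prosody-face congruency condition (assumed layout).
from collections import defaultdict

WINDOWS = [(0, 1250), (1250, 2500), (2500, 5000)]  # ms, from the abstract

def window_of(onset_ms):
    """Return the index of the time window containing a fixation onset."""
    for i, (lo, hi) in enumerate(WINDOWS):
        if lo <= onset_ms < hi:
            return i
    return None

def dwell_by_congruency(fixations):
    """fixations: iterable of dicts with hypothetical keys 'onset_ms',
    'duration_ms', 'face_emotion', 'voice_emotion'.
    Returns {(window_index, is_congruent): total_duration_ms}."""
    totals = defaultdict(float)
    for fix in fixations:
        w = window_of(fix["onset_ms"])
        if w is None:
            continue
        congruent = fix["face_emotion"] == fix["voice_emotion"]
        totals[(w, congruent)] += fix["duration_ms"]
    return dict(totals)

# Example: one fixation on the fearful face while hearing fearful prosody
demo = [{"onset_ms": 600, "duration_ms": 240,
         "face_emotion": "fear", "voice_emotion": "fear"}]
print(dwell_by_congruency(demo))  # {(0, True): 240.0}
```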

    It's not what you say but the way that you say it: an fMRI study of differential lexical and non-lexical prosodic pitch processing

    Background: This study aims to identify the neural substrate involved in prosodic pitch processing. Functional magnetic resonance imaging was used to test the premise that prosodic pitch processing is primarily subserved by the right cortical hemisphere. Two experimental paradigms were used: firstly, pairs of spoken sentences in which the only variation was a single internal phrase pitch change, and secondly, a matched condition utilizing pitch changes within analogous tone-sequence phrases, which removed the potential confounder of lexical evaluation. fMRI images were obtained using these paradigms. Results: Activation was significantly greater within the right frontal and temporal cortices during the tone-sequence stimuli relative to the sentence stimuli. Conclusion: This study showed that pitch changes, stripped of lexical information, are mainly processed by the right cerebral hemisphere, whilst the processing of analogous, matched, lexical pitch change is preferentially left-sided. These findings, showing hemispheric differentiation of processing based on stimulus complexity, are in accord with a 'task-dependent' hypothesis of pitch processing.
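    To make the lateralization claim concrete, here is a hedged sketch of a standard laterality index, LI = (L - R) / (L + R), often used to quantify hemispheric dominance from ROI activations. The abstract does not state that the authors computed this exact measure, and the input values below are purely illustrative.

```python
# Sketch: a conventional laterality index from summed ROI activations
# (e.g., suprathreshold voxel counts or mean beta values; units assumed).
def laterality_index(left_activation, right_activation):
    """Positive values indicate left-hemisphere dominance,
    negative values right-hemisphere dominance."""
    total = left_activation + right_activation
    if total == 0:
        raise ValueError("no activation in either hemisphere")
    return (left_activation - right_activation) / total

# Illustrative (made-up) numbers matching the reported pattern:
print(laterality_index(120, 310))  # tone sequences: negative -> right-lateralized
print(laterality_index(280, 150))  # lexical pitch change: positive -> left-lateralized
```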

    Time Course of the Involvement of the Right Anterior Superior Temporal Gyrus and the Right Fronto-Parietal Operculum in Emotional Prosody Perception

    In verbal communication, not only the meaning of the words conveys information; the tone of voice (prosody) also conveys crucial information about the emotional state and intentions of others. Various studies have found that right frontal and right temporal regions play a role in emotional prosody perception. Here, we used triple-pulse repetitive transcranial magnetic stimulation (rTMS) to shed light on the precise time course of involvement of the right anterior superior temporal gyrus and the right fronto-parietal operculum. We hypothesized that information would be processed in the right anterior superior temporal gyrus before being processed in the right fronto-parietal operculum. Right-handed healthy subjects performed an emotional prosody task. While they listened to each sentence, a triplet of TMS pulses was applied to one of the regions at one of six time points (400–1900 ms). Results showed a significant main effect of Time for both the right anterior superior temporal gyrus and the right fronto-parietal operculum, with the largest interference observed half-way through the sentence. This effect was stronger for withdrawal emotions than for the approach emotion. A further experiment that included an active control condition, TMS over the EEG site POz (midline parieto-occipital junction), revealed stronger effects at the fronto-parietal operculum and anterior superior temporal gyrus relative to the active control condition. No evidence was found for sequential processing of emotional prosodic information from the right anterior superior temporal gyrus to the right fronto-parietal operculum; rather, the results point to parallel processing. Our results suggest that both the right fronto-parietal operculum and the right anterior superior temporal gyrus are critical for emotional prosody perception at a relatively late time period after sentence onset. This may reflect the fact that emotional cues can still be ambiguous at the beginning of a sentence but become more apparent half-way through.
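    The chronometric logic of this design can be illustrated with a short sketch: average reaction times per TMS pulse time point and locate the point of maximal interference relative to a baseline. This is an assumed simplification, not the authors' analysis pipeline, and the numbers below are made up.

```python
# Sketch: mean RT slowing per TMS pulse time point vs. a baseline RT
# (assumed data structure; illustrative values only).
import statistics

def interference_by_timepoint(trials, baseline_rt):
    """trials: iterable of (pulse_time_ms, reaction_time_ms) tuples.
    Returns {pulse_time_ms: mean RT slowing vs. baseline}."""
    by_time = {}
    for t, rt in trials:
        by_time.setdefault(t, []).append(rt)
    return {t: statistics.mean(rts) - baseline_rt for t, rts in by_time.items()}

# Made-up data showing the strongest slowing mid-sentence:
trials = [(400, 820), (700, 845), (1000, 905), (1300, 930),
          (1600, 880), (1900, 850)]
effects = interference_by_timepoint(trials, baseline_rt=810)
peak = max(effects, key=effects.get)
print(effects)
print(f"Maximal interference at {peak} ms after sentence onset")
```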

    Compensatory premotor activity during affective face processing in subclinical carriers of a single mutant Parkin allele

    Patients with Parkinson's disease suffer from significant motor impairments and accompanying cognitive and affective dysfunction due to progressive disturbances of basal ganglia–cortical gating loops. Parkinson's disease has a long presymptomatic stage, which indicates a substantial capacity of the human brain to compensate for dopaminergic nerve degeneration before clinical manifestation of the disease. Neuroimaging studies provide evidence that increased motor-related cortical activity can compensate for progressive dopaminergic nerve degeneration in carriers of a single mutant Parkin or PINK1 gene, who show a mild but significant reduction of dopamine metabolism in the basal ganglia in the complete absence of clinical motor signs. However, it is currently unknown whether similar compensatory mechanisms are effective in non-motor basal ganglia–cortical gating loops. Here, we ask whether asymptomatic Parkin mutation carriers show altered patterns of brain activity during processing of facial gestures, and whether this might compensate for latent facial emotion recognition deficits. Current theories in social neuroscience assume that execution and perception of facial gestures are linked by a special class of visuomotor neurons ('mirror neurons') in the ventrolateral premotor cortex/pars opercularis of the inferior frontal gyrus (Brodmann area 44/6). We hypothesized that asymptomatic Parkin mutation carriers would show increased activity in this area during processing of affective facial gestures, replicating the compensatory motor effects that have previously been observed in these individuals. Additionally, Parkin mutation carriers might show altered activity in other basal ganglia–cortical gating loops. Eight asymptomatic heterozygous Parkin mutation carriers and eight matched controls underwent functional magnetic resonance imaging and a subsequent facial emotion recognition task. As predicted, Parkin mutation carriers showed significantly stronger activity in the right ventrolateral premotor cortex during execution and perception of affective facial gestures than healthy controls. Furthermore, Parkin mutation carriers showed a slightly reduced ability to recognize facial emotions, which was least severe in individuals who showed the strongest increase of ventrolateral premotor activity. In addition, Parkin mutation carriers showed a significantly weaker-than-normal increase of activity in the left lateral orbitofrontal cortex (inferior frontal gyrus pars orbitalis, Brodmann area 47), which was unrelated to facial emotion recognition ability. These findings are consistent with the hypothesis that compensatory activity in the ventrolateral premotor cortex during processing of affective facial gestures can reduce impairments in facial emotion recognition in subclinical Parkin mutation carriers. A breakdown of this compensatory mechanism might lead to the impairment of facial expressivity and facial emotion recognition observed in manifest Parkinson's disease.

    Instrumental Music Influences Recognition of Emotional Body Language

    In everyday life, emotional events are perceived by multiple sensory systems. Research has shown that recognition of emotions in one modality is biased towards the emotion expressed in a simultaneously presented but task-irrelevant modality. In the present study, we combine visual and auditory stimuli that convey similar affective meaning but have a low probability of co-occurrence in everyday life. Dynamic, face-blurred whole-body expressions of a person grasping an object while expressing happiness or sadness are presented in combination with fragments of happy or sad instrumental classical music. Participants were instructed to categorize the emotion expressed by the visual stimulus. The results show that recognition of body language is influenced by the auditory stimuli. These findings indicate that crossmodal influences, as previously observed for audiovisual speech, can also be obtained from the ignored auditory modality to the attended visual modality in audiovisual stimuli consisting of whole bodies and music.

    Estimation of metabolite T1 relaxation times using tissue specific analysis, signal averaging and bootstrapping from magnetic resonance spectroscopic imaging data

    Object: A novel method of estimating metabolite T1 relaxation times using MR spectroscopic imaging (MRSI) is proposed. As opposed to conventional single-voxel metabolite T1 estimation methods, this method investigates regional and gray matter (GM)/white matter (WM) differences in metabolite T1 by taking advantage of the spatial distribution information provided by MRSI.
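    This abstract describes estimating metabolite T1 from averaged MRSI signals with bootstrapping. The sketch below illustrates the general approach under assumed details: a saturation-recovery model S(TR) = S0(1 - exp(-TR/T1)) fitted by least squares, with voxel-level bootstrap resampling within one tissue class. The acquisition scheme, signal model, and data layout are assumptions, not the authors' implementation.

```python
# Sketch: fit T1 from saturation-recovery signals averaged over a tissue
# class, with a bootstrap over voxels for confidence intervals (assumed model).
import numpy as np
from scipy.optimize import curve_fit

def saturation_recovery(tr, s0, t1):
    """Signal as a function of repetition time TR for a given T1."""
    return s0 * (1.0 - np.exp(-tr / t1))

def fit_t1(tr_values, signals):
    """Least-squares fit of S0 and T1 (same time units as tr_values)."""
    popt, _ = curve_fit(saturation_recovery, tr_values, signals,
                        p0=(signals.max(), 1.0))
    return popt  # (s0_hat, t1_hat)

def bootstrap_t1(tr_values, voxel_signals, n_boot=1000, seed=0):
    """voxel_signals: (n_voxels, n_TRs) array from one tissue class
    (e.g., GM). Resample voxels with replacement, average, refit."""
    rng = np.random.default_rng(seed)
    n_vox = voxel_signals.shape[0]
    t1s = []
    for _ in range(n_boot):
        idx = rng.integers(0, n_vox, n_vox)
        mean_signal = voxel_signals[idx].mean(axis=0)
        t1s.append(fit_t1(tr_values, mean_signal)[1])
    return np.percentile(t1s, [2.5, 50, 97.5])

# Synthetic demo: 50 voxels, true T1 = 1.4 s, with additive noise
tr = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
true = saturation_recovery(tr, s0=100.0, t1=1.4)
voxels = true + np.random.default_rng(1).normal(0, 2.0, (50, tr.size))
print(bootstrap_t1(tr, voxels, n_boot=200))  # [low, median, high] T1 estimate
```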